Asymptotically Exact Error Analysis for the Generalized ℓ₂²-LASSO
Authors
Abstract
Given an unknown signal x0 ∈ Rⁿ and linear noisy measurements y = Ax0 + σv ∈ Rᵐ, the generalized ℓ₂²-LASSO solves x̂ := argmin_x ½‖y − Ax‖₂² + σλf(x). Here, f is a convex regularization function (e.g. ℓ₁-norm, nuclear norm) aiming to promote the structure of x0 (e.g. sparse, low-rank), and λ ≥ 0 is the regularizer parameter. A related optimization problem, though not as popular or well known, is often referred to as the generalized ℓ₂-LASSO and takes the form x̂ := argmin_x ‖y − Ax‖₂ + λf(x); it has been analyzed in [1], which further made conjectures about the performance of the generalized ℓ₂²-LASSO. This paper establishes these conjectures rigorously. We measure performance with the normalized squared error NSE(σ) := ‖x̂ − x0‖₂²/σ². Assuming the entries of A and v are i.i.d. standard normal, we precisely characterize the "asymptotic NSE" aNSE := lim_{σ→0} NSE(σ) when the problem dimensions m, n tend to infinity in a proportional manner. The roles of λ, f and x0 are explicitly captured in the derived expression by means of a single geometric quantity, the Gaussian distance to the subdifferential. We conjecture that aNSE = sup_{σ>0} NSE(σ). We include detailed discussions on the interpretation of our result, make connections to relevant literature, and perform computational experiments that validate our theoretical findings.
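The setup above can be illustrated numerically. The sketch below (not the authors' code; problem sizes m, n, the sparsity k, the noise level σ, the regularizer λ, and the iteration count are all illustrative choices) generates sparse x0, takes Gaussian measurements, solves the generalized ℓ₂²-LASSO with f = ℓ₁-norm via proximal gradient descent (ISTA), and reports the empirical NSE:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, k = 80, 200, 10           # measurements, ambient dimension, sparsity (illustrative)
sigma, lam = 0.05, 2.0          # noise level and regularizer parameter (illustrative)

# Sparse ground truth and i.i.d. standard normal A, v, as assumed in the paper
x0 = np.zeros(n)
x0[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n))
y = A @ x0 + sigma * rng.standard_normal(m)

# ISTA for: minimize 0.5 * ||y - A x||_2^2 + sigma * lam * ||x||_1
step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L, L = Lipschitz constant of the gradient
x = np.zeros(n)
for _ in range(2000):
    z = x - step * (A.T @ (A @ x - y))                           # gradient step
    x = np.sign(z) * np.maximum(np.abs(z) - step * sigma * lam, 0.0)  # soft threshold

nse = np.linalg.norm(x - x0) ** 2 / sigma ** 2
print(f"empirical NSE at sigma={sigma}: {nse:.3f}")
```

Repeating this over a range of small σ values, the empirical NSE should approach the asymptotic value aNSE characterized in the paper.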
Similar resources
Convergence theorems of multi-step iterative algorithm with errors for generalized asymptotically quasi-nonexpansive mappings in Banach spaces
The purpose of this paper is to study and give the necessary and sufficient condition of strong convergence of the multi-step iterative algorithm with errors for a finite family of generalized asymptotically quasi-nonexpansive mappings to converge to common fixed points in Banach spaces. Our results extend and improve some recent results in the literature (see, e.g. [2, 3, 5, 6, 7, 8, 11, 14, 19]).
Unified LASSO Estimation via Least Squares Approximation
We propose a method of least squares approximation (LSA) for unified yet simple LASSO estimation. Our general theoretical framework includes ordinary least squares, generalized linear models, quantile regression, and many others as special cases. Specifically, LSA can transfer many different types of LASSO objective functions into their asymptotically equivalent least-squares problems. Thereaft...
Weak and strong convergence theorems for a finite family of generalized asymptotically quasi-nonexpansive nonself-mappings
In this paper, we introduce and study a new iterative scheme to approximate a common fixed point for a finite family of generalized asymptotically quasi-nonexpansive nonself-mappings in Banach spaces. Several strong and weak convergence theorems of the proposed iteration are established. The main results obtained in this paper generalize and refine some known results in the current literature.
Combined ℓ1 and greedy ℓ0 penalized least squares for linear model selection
We introduce a computationally effective algorithm for a linear model selection consisting of three steps: screening–ordering–selection (SOS). Screening of predictors is based on the thresholded Lasso, that is, ℓ1-penalized least squares. The screened predictors are then fitted using least squares (LS) and ordered with respect to their |t| statistics. Finally, a model is selected using greedy gen...
Journal: CoRR
Volume: abs/1502.06287
Pages: -
Publication year: 2015